Permutation Excess Entropy and Mutual Information between the Past and Future

Authors

  • Taichi Haruna
  • Kohei Nakajima
Abstract

We address the excess entropy, which is a measure of complexity for stationary time series, from the ordinal point of view. We show that the permutation excess entropy is equal to the mutual information between two adjacent semi-infinite blocks in the space of orderings for finite-state stationary ergodic Markov processes. This result may shed new light on the relationship between complexity and anticipation.
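As a rough guide to the quantities in the abstract (the notation below is ours, not the paper's): the excess entropy of a stationary process can be written either as the limit of block-entropy redundancies or as the mutual information between the semi-infinite past and future, and the permutation excess entropy replaces block entropies by entropies of ordinal patterns.

  E(X)   = \lim_{L\to\infty} \left[ H(X_1,\dots,X_L) - L\, h_\mu \right]
         = \lim_{L\to\infty} I(X_{-L+1},\dots,X_0 \,;\, X_1,\dots,X_L),
  E^*(X) = \lim_{L\to\infty} \left[ H^*(L) - L\, h^*_\mu \right],

where H(X_1,\dots,X_L) is the Shannon entropy of length-L blocks, h_\mu the entropy rate, H^*(L) the entropy of the distribution of rank orderings (ordinal patterns) of length-L words, and h^*_\mu the permutation entropy rate. The result stated above is that, for finite-state stationary ergodic Markov processes, E^*(X) equals the mutual information between the orderings of two adjacent semi-infinite blocks, mirroring the past-future characterization of E(X).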

Related articles

Ergodic decomposition of excess entropy and conditional mutual information

The article discusses excess entropy, defined as the mutual information between the past and future of a stationary process. The central result is an ergodic decomposition: excess entropy is the sum of the self-information of the shift-invariant σ-field and the average of the excess entropies of the ergodic components of the process. The result is derived using generalized conditional mutual information for fi...
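Written out, the decomposition described in this snippet has the form (a paraphrase in our notation, not the paper's exact statement):

  E(X) = H(\mathcal{I}) + \mathbb{E}\left[ E_{\text{component}} \right],

where \mathcal{I} is the shift-invariant σ-field, H(\mathcal{I}) is its self-information, and E_{\text{component}} denotes the excess entropy of a randomly drawn ergodic component of the process, averaged over components.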

The Past and the Future in the Present

Predicting and modeling a system are distinct, but intimately related goals. Leveraging past observations, prediction attempts to make correct statements about what the future will bring, whereas modeling attempts to express the mechanisms behind the observations. In this view, building a model from observations is tantamount to decrypting a system’s hidden organization. The cryptographic view ...

On Hidden Markov Processes with Infinite Excess Entropy

We investigate stationary hidden Markov processes for which mutual information between the past and the future is infinite. It is assumed that the number of observable states is finite and the number of hidden states is countably infinite. Under this assumption, we show that the block mutual information of a hidden Markov process is upper bounded by a power law determined by the tail index of t...
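In the notation used elsewhere on this page, the bound described here concerns the block mutual information between adjacent length-L blocks; the snippet is truncated, so the constant and exponent below are placeholders rather than the paper's values:

  I(L) = I(X_1,\dots,X_L \,;\, X_{L+1},\dots,X_{2L}) \le C\, L^{\gamma},

where γ is determined by the tail index the abstract refers to. Since infinite excess entropy means I(L) diverges, a power-law ceiling of this kind quantifies how fast the past-future mutual information can grow.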

An entropic characterization of long memory stationary process

Long memory or long range dependency is an important phenomenon that may arise in the analysis of time series or spatial data. Most of the definitions of long memory of a stationary process X = {X_1, X_2, ...} are based on the second-order properties of the process. The excess entropy of a stationary process is the summation of redundancies which relates to the rate of convergence of the con...
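The "summation of redundancies" mentioned here is the standard block-entropy form of excess entropy (notation ours):

  E(X) = \sum_{L=1}^{\infty} \left[ H(X_L \mid X_1,\dots,X_{L-1}) - h_\mu \right],

where each term is the per-symbol redundancy, i.e. the gap between the L-th conditional entropy and the entropy rate h_\mu. E(X) is finite exactly when these gaps are summable, which is the rate-of-convergence handle on long memory that the abstract alludes to.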

Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states.

One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with t...
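The general fact this abstract builds on is the data-processing inequality for sufficient statistics (background, not the paper's contribution):

  I(f(X) \,;\, Y) \le I(X \,;\, Y) for any measurable function f,

with equality exactly when f(X) is a sufficient statistic of X for Y, i.e. when X and Y are conditionally independent given f(X). Replacing X by a lower-dimensional sufficient (or approximately sufficient) statistic therefore preserves, or nearly preserves, the mutual information one wants to estimate.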

Journal:
  • CoRR

Volume: abs/1112.2491  Issue: -

Pages: -

Publication date: 2011